A Modified Recursive Regularization Factor Calculation for Sparse RLS Algorithm with l1-Norm

Authors

Abstract

In this paper, we propose a new calculation method for the regularization factor in the sparse recursive least squares (SRLS) algorithm with an l1-norm penalty. The proposed method requires no prior knowledge of the actual system impulse response, and it also reduces the computational complexity by about half. In the simulations, we use the Mean Square Deviation (MSD) to evaluate the performance of SRLS with the proposed regularization factor. The simulation results demonstrate that SRLS with the proposed factor differs by less than 2 dB in MSD from the conventional method, which relies on the true impulse response. Therefore, it is confirmed that the proposed method performs very similarly to the existing one, at about half the computational complexity.
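The abstract does not reproduce the paper's recursive formula for the regularization factor, so the following is only a minimal sketch of a generic l1-norm regularized RLS identification loop with MSD evaluation. The zero-attracting correction term and the constant `reg_factor` are assumptions of this illustration standing in for the proposed recursive calculation.

```python
# Minimal sketch of an l1-norm regularized RLS (SRLS) system identification
# loop. The constant `reg_factor` is a hypothetical placeholder for the
# paper's recursively computed regularization factor.
import numpy as np

def srls_identify(x, d, order, lam=0.999, delta=1e-2, reg_factor=1e-4):
    """Identify an FIR system from input x and desired output d."""
    w = np.zeros(order)                 # filter estimate
    P = np.eye(order) / delta           # inverse correlation matrix
    u = np.zeros(order)                 # tap-delay input regressor
    w_hist = []
    for n in range(len(x)):
        u = np.concatenate(([x[n]], u[:-1]))      # shift in the newest sample
        e = d[n] - u @ w                          # a priori error
        k = (P @ u) / (lam + u @ P @ u)           # gain vector
        P = (P - np.outer(k, u @ P)) / lam        # inverse-correlation update
        w = w + k * e                             # standard RLS update
        w = w - reg_factor * (P @ np.sign(w))     # l1 (zero-attracting) correction
        w_hist.append(w.copy())
    return np.array(w_hist)

# MSD evaluation against a sparse true impulse response (synthetic example)
rng = np.random.default_rng(0)
w_true = np.zeros(32); w_true[[3, 10, 25]] = [1.0, -0.5, 0.3]
x = rng.standard_normal(4000)
d = np.convolve(x, w_true)[:len(x)] + 0.01 * rng.standard_normal(len(x))
W = srls_identify(x, d, order=32)
msd_db = 10 * np.log10(np.sum((W - w_true) ** 2, axis=1))
print("final MSD [dB]:", msd_db[-1])
```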


Similar Articles

Recursive Algorithm for L1 Norm Estimation in Linear Models

The L1 norm estimator has been widely used as a robust parameter estimation method for outlier detection. Different algorithms have been applied for L1 norm minimization, among which the linear programming formulation based on the simplex method is well known. In the present contribution, in order to solve an L1 norm minimization problem in a linear model, an interior point algorithm is developed which ...
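As a rough illustration of the linear programming reformulation mentioned above, the sketch below solves an L1-norm regression with SciPy's general-purpose LP solver rather than the specialized interior point algorithm developed in the paper.

```python
# Sketch: L1-norm parameter estimation in a linear model y ≈ A @ beta,
# reformulated as a linear program with slack variables t >= |y - A beta|.
import numpy as np
from scipy.optimize import linprog

def l1_regression(A, y):
    m, p = A.shape
    # variables z = [beta (p entries, free), t (m entries, >= 0)]; minimize sum(t)
    c = np.concatenate([np.zeros(p), np.ones(m)])
    # A beta - t <= y  and  -A beta - t <= -y  together encode |y - A beta| <= t
    A_ub = np.block([[ A, -np.eye(m)],
                     [-A, -np.eye(m)]])
    b_ub = np.concatenate([y, -y])
    bounds = [(None, None)] * p + [(0, None)] * m
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.x[:p]

# Example: robust line fit in the presence of an outlier
A = np.column_stack([np.ones(6), np.arange(6.0)])
y = np.array([0.1, 1.0, 2.1, 2.9, 4.0, 20.0])   # last point is an outlier
print(l1_regression(A, y))
```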


Sparse Trace Norm Regularization

We study the problem of estimating multiple predictive functions from a dictionary of basis functions in the nonparametric regression setting. Our estimation scheme assumes that each predictive function can be estimated in the form of a linear combination of the basis functions. By assuming that the coefficient matrix admits a sparse low-rank structure, we formulate the function estimation prob...
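The sketch below shows only the two proximal operators usually associated with a sparse low-rank (l1 plus trace-norm) penalty on the coefficient matrix; it does not reproduce the paper's full estimation scheme or analysis.

```python
# Sketch: elementwise soft-thresholding for the l1 term and singular-value
# soft-thresholding for the trace (nuclear) norm term.
import numpy as np

def prox_l1(W, tau):
    """Elementwise soft-thresholding: prox of tau * ||W||_1."""
    return np.sign(W) * np.maximum(np.abs(W) - tau, 0.0)

def prox_trace(W, tau):
    """Singular-value soft-thresholding: prox of tau * ||W||_*."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

# A low-rank coefficient matrix, shrunk by both operators
rng = np.random.default_rng(1)
W = rng.standard_normal((20, 3)) @ rng.standard_normal((3, 15))
print(np.linalg.matrix_rank(prox_trace(W, tau=2.0)),
      np.count_nonzero(prox_l1(W, tau=1.0)))
```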


Least Squares Optimization with L1-Norm Regularization

This project surveys and examines optimization approaches proposed for parameter estimation in Least Squares linear regression models with an L1 penalty on the regression coefficients. We first review linear regression and regularization, and both motivate and formalize this problem. We then give a detailed analysis of 8 of the varied approaches that have been proposed for optimizing this objec...
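One of the standard approaches covered by such surveys is proximal gradient descent (iterative soft-thresholding, ISTA). The sketch below is a minimal version for the l1-penalized least squares objective; the step size 1/L, with L the squared spectral norm of X, is an assumption of this illustration.

```python
# Sketch of ISTA for  min_beta 0.5*||y - X beta||^2 + alpha*||beta||_1.
import numpy as np

def lasso_ista(X, y, alpha, n_iter=500):
    L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the gradient
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y)        # gradient of the smooth part
        z = beta - grad / L                # gradient step
        beta = np.sign(z) * np.maximum(np.abs(z) - alpha / L, 0.0)  # prox step
    return beta

rng = np.random.default_rng(2)
X = rng.standard_normal((100, 50))
beta_true = np.zeros(50); beta_true[:5] = 3.0
y = X @ beta_true + 0.1 * rng.standard_normal(100)
print(np.nonzero(lasso_ista(X, y, alpha=5.0))[0])   # indices of nonzero coefficients
```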


Efficient L1/Lq Norm Regularization

Sparse learning has recently received increasing attention in many areas including machine learning, statistics, and applied mathematics. The mixed-norm regularization based on the l1/lq norm with q > 1 is attractive in many applications of regression and classification in that it facilitates group sparsity in the model. The resulting optimization problem is, however, challenging to solve due t...
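For the common q = 2 case, the key computational step is the blockwise shrinkage (proximal) operator sketched below; the harder general q > 1 case treated in the paper is not covered here.

```python
# Sketch: proximal operator of the l1/l2 mixed norm (group sparsity, q = 2);
# each group is scaled by max(0, 1 - tau/||v_g||_2), zeroing weak groups.
import numpy as np

def prox_group_l1l2(v, groups, tau):
    """Blockwise shrinkage of the vector v over the given index groups."""
    out = np.zeros_like(v)
    for g in groups:
        norm_g = np.linalg.norm(v[g])
        if norm_g > tau:
            out[g] = (1.0 - tau / norm_g) * v[g]
    return out

v = np.array([3.0, 4.0, 0.1, -0.2, 1.0, 1.0])
groups = [slice(0, 2), slice(2, 4), slice(4, 6)]
print(prox_group_l1l2(v, groups, tau=1.0))   # the weak middle group is zeroed out
```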


Regularization of the RLS Algorithm

SUMMARY Regularization plays a fundamental role in adaptive filtering. There are, very likely, many different ways to regularize an adaptive filter. In this letter, we propose one possible way to do it based on a condition that makes intuitive sense. From this condition, we show how to regularize the recursive least-squares (RLS) algorithm.
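The letter's condition and the resulting formula are not given in this summary, so the sketch below only shows where regularization usually enters RLS, namely the initialization of the inverse correlation matrix; the rule for choosing the loading is a hypothetical placeholder, not the letter's result.

```python
# Sketch: regularized RLS initialization P(0) = delta^{-1} I. The rule tying
# delta to the input power and a target dynamic range is an illustrative
# assumption, not the letter's derived condition.
import numpy as np

def init_regularized_rls(order, x_power, dynamic_range_db=30.0):
    delta = order * x_power / (10.0 ** (dynamic_range_db / 10.0))  # hypothetical rule
    P0 = np.eye(order) / delta          # regularized initial inverse correlation
    w0 = np.zeros(order)                # initial filter estimate
    return w0, P0

x = np.random.default_rng(3).standard_normal(1000)
w0, P0 = init_regularized_rls(order=16, x_power=np.mean(x ** 2))
print(P0[0, 0])
```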



Journal

Journal title: Mathematics

Year: 2021

ISSN: 2227-7390

DOI: https://doi.org/10.3390/math9131580